Nearest Neighbor Classification with a Local Asymmetrically Weighted Metric

Author

  • Francesco Ricci
Abstract

This paper introduces a new local asymmetric weighting scheme for the nearest neighbor classification algorithm. It is shown, both with theoretical arguments and computer experiments, that good compression rates can be achieved while outperforming the accuracy of the standard nearest neighbor classification algorithm and obtaining almost the same accuracy as the k-NN algorithm with k optimised on each data set. The improvement in time performance is proportional to the compression rate and in general depends on the data set. A comparison of the classification accuracy of the proposed algorithm against a local symmetrically weighted metric and against a global metric strongly suggests that the proposed scheme is to be preferred.
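To make the idea concrete, the following is a minimal sketch of a 1-NN rule in which each stored prototype carries its own pair of feature-weight vectors, so the effective metric is both local (per prototype) and asymmetric (the weight applied to a coordinate depends on the sign of the difference). The function names, the two-weight-vector representation, and the toy data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def weighted_nn_predict(prototypes, labels, weights, x):
    """1-NN classification with a local asymmetric metric: each prototype
    p_i has two weight vectors, w_pos (used where x_j >= p_j) and w_neg
    (used where x_j < p_j).  With all weights equal to 1 this reduces to
    the ordinary Euclidean 1-NN rule."""
    best_label, best_dist = None, np.inf
    for p, y, (w_pos, w_neg) in zip(prototypes, labels, weights):
        diff = x - p
        w = np.where(diff >= 0, w_pos, w_neg)     # asymmetric weighting
        d = np.sqrt(np.sum(w * diff ** 2))        # locally weighted distance
        if d < best_dist:
            best_label, best_dist = y, d
    return best_label

# Toy example: uniform weights, so this behaves like plain 1-NN.
protos = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
labs = ["a", "b"]
ws = [(np.ones(2), np.ones(2)), (np.ones(2), np.ones(2))]
print(weighted_nn_predict(protos, labs, ws, np.array([0.9, 0.8])))  # b
```

Because each prototype owns its weights, shrinking the stored set (compression) and tuning the metric can be done jointly, which is what allows fewer prototypes to match the accuracy of plain k-NN.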


Similar resources

Exact Learning and Data Compression with a Local Asymmetrically Weighted Metric

This paper is concerned with a local asymmetric weighting scheme for the nearest neighbor classification algorithm and a learning procedure, based on reinforcement, for computing the weights. Theoretical results show that this context-dependent metric can learn exactly certain classes of concepts while storing fewer examples than those required by the Euclidean metric. Moreover, computer experiments ...


A Minimum Risk Metric for Nearest Neighbor Classification

nale. Retrieval in a prototype-based case library: A case study in diabetes therapy revision. [CH97] C. Cardie and N. Howe. Improving minority class prediction using case-specific feature weights. [CS93] Scott Cost and Steven Salzberg. A weighted nearest neighbor algorithm for learning with symbolic features. [DP97] Pedro Domingos and Michael Pazzani. On the optimality of the simple Bayesian clas-si...


Locally Adaptive Metric Nearest Neighbor Classification

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-square...


Adaptive Nearest Neighbor Classification Using Support Vector Machines

The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. We propose a technique that computes a locally flexible metric by means of Support Vector Machines (S...


K Nearest Neighbor Classification with Local Induction of the Simple Value Difference Metric

The classical k nearest neighbor (k-nn) classification assumes that a fixed global metric is defined and searching for nearest neighbors is always based on this global metric. In the paper we present a model with local induction of a metric. Any test object induces a local metric from the neighborhood of this object and selects k nearest neighbors according to this locally induced metric. To in...
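The two-stage scheme described above can be sketched as follows. Note that the weighting rule here (inverse local feature variance) is a simple illustrative stand-in for the paper's Simple Value Difference Metric induction, and all names and the toy data are assumptions.

```python
import numpy as np
from collections import Counter

def knn_local_metric(X, y, x, k=3, m=6):
    """k-NN with a locally induced metric: (1) select the m nearest
    training objects to x under the global Euclidean metric, (2) induce
    per-feature weights from that neighborhood (here: inverse local
    variance), (3) re-rank the neighborhood under the weighted metric
    and take a majority vote among the k nearest."""
    d_global = np.linalg.norm(X - x, axis=1)
    local = np.argsort(d_global)[:m]              # neighborhood of the test object
    var = X[local].var(axis=0)
    w = 1.0 / (var + 1e-9)                        # locally induced feature weights
    d_local = np.sqrt(((X[local] - x) ** 2 * w).sum(axis=1))
    chosen = local[np.argsort(d_local)[:k]]
    return Counter(y[chosen]).most_common(1)[0][0]

# Toy example: two well-separated clusters.
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array(["a", "a", "a", "b", "b", "b"])
print(knn_local_metric(X, y, np.array([5.2, 5.1])))  # b
```

The point of inducing the metric per test object is that feature relevance can vary across the input space, so a single global weighting cannot capture it.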



Journal title:

Volume   Issue 

Pages  -

Publication year: 1996